
    Auditory affective processing requires awareness

    Recent work has challenged the previously widely accepted belief that affective processing does not require awareness and can be carried out with more limited resources than semantic processing. This debate has focused exclusively on visual perception, even though evidence from both human and animal studies suggests that nonconscious affective processing would be physiologically more feasible in the auditory system. Here we contrast affective and semantic processing of nonverbal emotional vocalizations under different levels of awareness in three experiments, using explicit (two-alternative forced choice masked affective and semantic categorization tasks, Experiments 1 and 2) and implicit (masked affective and semantic priming, Experiment 3) measures. Identical stimuli and design were used in the semantic and affective tasks. Awareness was manipulated by altering the stimulus-mask signal-to-noise ratio during continuous auditory masking. Stimulus awareness was measured on each trial using a four-point perceptual awareness scale. In the explicit tasks, neither affective nor semantic categorization could be performed in the complete absence of awareness, while both tasks could be performed above chance level when stimuli were consciously perceived. Semantic categorization was faster than affective evaluation. When the stimuli were partially perceived, semantic categorization accuracy exceeded affective evaluation accuracy. In the implicit tasks, neither affective nor semantic priming occurred in the complete absence of awareness, whereas both affective and semantic priming emerged when participants were aware of the primes. We conclude that auditory semantic processing is faster than affective processing, and that both affective and semantic auditory processing are dependent on awareness.

    Carnal pleasures

    Pleasures are tightly intertwined with the body. Enjoyment derived from sex, feeding and social touch originates from somatosensory and gustatory processing, and pleasant emotions also markedly influence bodily states tied to the reproductive, digestive, skeletomuscular, and endocrine systems. Here, we review recent research on bodily pleasures, focussing on consummatory sensory pleasures. We discuss how different pleasures have distinct sensory inputs and behavioural outputs and review the data on the role of the somatosensory and interoceptive systems in social bonding. Finally, we review the role of gustatory pleasures in feeding and obesity, and discuss the underlying pathophysiological mechanisms. We conclude that different pleasures have distinct inputs and specific outputs, and that their regulatory functions should be understood in light of these specific profiles in addition to generic reward mechanisms.

    Connectivity Analysis Reveals a Cortical Network for Eye Gaze Perception

    Haxby et al. (Haxby JV, Hoffman EA, Gobbini MI. 2000. The distributed human neural system for face perception. Trends Cogn Sci. 4:223–233.) proposed that eye gaze processing results from an interaction between a “core” face-specific system involved in visual analysis and an “extended” system involved in spatial attention more generally. However, the full gaze perception network has remained poorly specified. In the context of a functional magnetic resonance imaging study, we used psychophysiological interactions (PPIs) to identify brain regions that showed differential connectivity (correlation) with core face perception structures (posterior superior temporal sulcus [pSTS] and fusiform gyrus [FG]) when viewing gaze shifts relative to control eye movements (opening/closing the eyes). The PPIs identified altered connectivity between the pSTS and MT/V5, intraparietal sulcus, frontal eye fields, superior temporal gyrus (STG), supramarginal gyrus, and middle frontal gyrus (MFG). The FG showed altered connectivity with the same areas of the STG and MFG, demonstrating the contribution of both dorsal and ventral core face areas to gaze perception. We propose that this network provides an interactive system that alerts us to seen changes in other agents’ gaze direction, makes us aware of their altered focus of spatial attention, and prepares a corresponding shift in our own attention.

    Eye Contact Judgment Is Influenced by Perceivers' Social Anxiety But Not by Their Affective State

    Fast and accurate judgment of whether another person is making eye contact or not is crucial for our social interaction. As affective states have been shown to influence social perceptions and judgments, we investigated the influence of observers' own affective states and trait anxiety on their eye contact judgments. In two experiments, participants were required to judge whether animated faces (Experiment 1) and real faces (Experiment 2) with varying gaze angles were looking at them or not. Participants performed the task in pleasant, neutral, and unpleasant odor conditions. The results from the two experiments showed that eye contact judgments were not modulated by observers' affective state, yet participants with higher levels of social anxiety accepted a wider range of gaze deviations from the direct gaze as eye contact. We conclude that gaze direction judgments depend on individual differences in affective predispositions, yet they are not amenable to situational affective influences.

    Mental Action Simulation Synchronizes Action-Observation Circuits across Individuals

    A frontoparietal action–observation network (AON) has been proposed to support understanding others' actions and goals. We show that the AON "ticks together" in human subjects who are sharing a third person's feelings. During functional magnetic resonance imaging, 20 volunteers watched movies depicting boxing matches passively or while simulating a prespecified boxer's feelings. Instantaneous intersubject phase synchronization (ISPS) was computed to derive multisubject voxelwise similarity of hemodynamic activity and inter-area functional connectivity. During passive viewing, subjects' brain activity was synchronized in sensory projection and posterior temporal cortices. Simulation induced a widespread increase of ISPS in the AON (premotor, posterior parietal, and superior temporal cortices), primary and secondary somatosensory cortices, and the dorsal attention circuits (frontal eye fields, intraparietal sulcus). Moreover, interconnectivity of these regions strengthened during simulation. We propose that sharing a third person's feelings synchronizes the observer's own brain mechanisms supporting sensations and motor planning, thereby likely promoting mutual understanding.
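    The ISPS measure mentioned above is commonly computed as the Kuramoto order parameter over subjects' instantaneous BOLD phases, obtained via the Hilbert transform. The sketch below illustrates that general technique only; it is not the authors' analysis code, and the function name `isps` and the synthetic input are hypothetical:

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def isps(bold):
        """Instantaneous inter-subject phase synchronization.

        bold: array of shape (n_subjects, n_timepoints), one (ideally
        band-pass filtered) BOLD time series per subject for one voxel.
        Returns R(t) in [0, 1] per time point: the magnitude of the mean
        unit phase vector across subjects (1 = all subjects in phase).
        """
        centered = bold - bold.mean(axis=1, keepdims=True)
        phases = np.angle(hilbert(centered, axis=1))  # instantaneous phase
        return np.abs(np.exp(1j * phases).mean(axis=0))
    ```

    Identical time series across subjects yield R(t) = 1 at every time point, while independent noise drives R(t) toward zero as the number of subjects grows.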

    Maps of subjective feelings

    Subjective feelings are a central feature of human life. We defined the organization and determinants of a feeling space involving 100 core feelings that ranged from cognitive and affective processes to somatic sensations and common illnesses. The feeling space was determined by a combination of basic dimension rating, similarity mapping, bodily sensation mapping, and neuroimaging meta-analysis. A total of 1,026 participants took part in online surveys where we assessed (i) for each feeling, the intensity of four hypothesized basic dimensions (mental experience, bodily sensation, emotion, and controllability), (ii) subjectively experienced similarity of the 100 feelings, and (iii) topography of bodily sensations associated with each feeling. Neural similarity between a subset of the feeling states was derived from the NeuroSynth meta-analysis database based on the data from 9,821 brain-imaging studies. All feelings were emotionally valenced, and the saliency of bodily sensations correlated with the saliency of mental experiences associated with each feeling. Nonlinear dimensionality reduction revealed five feeling clusters: positive emotions, negative emotions, cognitive processes, somatic states and illnesses, and homeostatic states. Organization of the feeling space was best explained by basic dimensions of emotional valence, mental experiences, and bodily sensations. Subjectively felt similarity of feelings was associated with basic feeling dimensions and the topography of the corresponding bodily sensations. These findings reveal a map of subjective feelings that are categorical, emotional, and embodied.
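    The general idea of turning pairwise similarity ratings into a low-dimensional feeling map can be sketched with classical multidimensional scaling; note the abstract describes a nonlinear method, so MDS here is a simpler stand-in, and the 4x4 similarity matrix is an invented toy example, not the study's data:

    ```python
    import numpy as np
    from sklearn.manifold import MDS

    # Toy similarity matrix for four hypothetical feelings (rows/cols:
    # joy, pride, fear, anger); real data would be a 100x100 matrix of
    # mean subjective similarity ratings.
    sim = np.array([
        [1.0, 0.9, 0.2, 0.1],
        [0.9, 1.0, 0.3, 0.2],
        [0.2, 0.3, 1.0, 0.8],
        [0.1, 0.2, 0.8, 1.0],
    ])
    dissim = 1.0 - sim  # convert similarity to dissimilarity

    mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
    coords = mds.fit_transform(dissim)  # one 2-D point per feeling
    ```

    Similar feelings (joy and pride) end up near each other in the embedding, while dissimilar ones (joy and fear) are pushed apart, which is how clusters such as "positive emotions" become visible in the map.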

    Decoding brain basis of laughter and crying in natural scenes

    Laughter and crying are universal signals of prosociality and distress, respectively. Here we investigated the functional brain basis of perceiving laughter and crying using a naturalistic functional magnetic resonance imaging (fMRI) approach. We measured haemodynamic brain activity evoked by laughter and crying in three experiments with 100 subjects in each. The subjects i) viewed a 20-minute medley of short video clips, ii) viewed 30 min of a full-length feature film, and iii) listened to 13.5 min of a radio play, all of which contained bursts of laughter and crying. Intensity of laughing and crying in the videos and radio play was annotated by independent observers, and the resulting time series were used to predict hemodynamic activity to laughter and crying episodes. Multivariate pattern analysis (MVPA) was used to test for regional selectivity in laughter- and crying-evoked activations. Laughter induced widespread activity in ventral visual cortex and superior and middle temporal and motor cortices. Crying activated thalamus, cingulate cortex along the anterior-posterior axis, insula and orbitofrontal cortex. Both laughter and crying could be decoded accurately (66–77% depending on the experiment) from the BOLD signal, and the voxels contributing most significantly to classification were in superior temporal cortex. These results suggest that perceiving laughter and crying engages distinct neural networks, whose activity suppresses each other to manage appropriate behavioral responses to others’ bonding and distress signals.
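    MVPA decoding of the kind described above amounts to training a classifier on voxel activity patterns and cross-validating its accuracy against the 50% chance level. The sketch below shows the generic technique on synthetic data; the trial counts, voxel counts, and effect size are all invented, and the authors' actual pipeline may differ:

    ```python
    import numpy as np
    from sklearn.svm import LinearSVC
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(42)
    n_trials, n_voxels = 80, 500

    # Synthetic "BOLD patterns": 0 = laughter trials, 1 = crying trials,
    # with a small class-specific signal added to 50 of the 500 voxels.
    labels = np.repeat([0, 1], n_trials // 2)
    patterns = rng.standard_normal((n_trials, n_voxels))
    patterns[labels == 1, :50] += 0.5

    # Linear classifier with voxel-wise standardization, 5-fold CV.
    clf = make_pipeline(StandardScaler(), LinearSVC())
    acc = cross_val_score(clf, patterns, labels, cv=5).mean()
    print(f"decoding accuracy: {acc:.2f}")
    ```

    Cross-validated accuracy reliably above 0.5 indicates that the two categories evoke distinguishable voxel patterns, which is the logic behind the 66–77% decoding figures reported in the abstract.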

    Affective Adaptation to Repeated SIT and MICT Protocols in Insulin-Resistant Subjects

    Introduction: The aim of this study was to investigate affective responses to repeated sessions of sprint interval training (SIT) in comparison with moderate-intensity continuous training (MICT) in insulin-resistant subjects. Methods: Twenty-six insulin-resistant adults (age, 49 (4) yr; 10 women) were randomized into SIT (n = 13) or MICT (n = 13) groups. Subjects completed six supervised training sessions within 2 wk (SIT session, 4-6 x 30 s all-out cycling/4-min recovery; MICT session, 40-60 min at 60% peak work load). Perceived exertion, stress, and affective state were assessed with questionnaires before, during and after each training session. Results: Perceived exertion, displeasure, and arousal were higher during the SIT compared with MICT sessions (all P < 0.05). Conclusions: The perceptual and affective responses are more negative both during and acutely after SIT compared with MICT in untrained insulin-resistant adults. These responses, however, show significant improvements already within six training sessions, indicating rapid positive affective and physiological adaptations to continual exercise training, both SIT and MICT. These findings suggest that even very intense SIT is a mentally tolerable alternative for untrained people with insulin resistance.

    Autism spectrum traits predict the neural response to eye gaze in typical individuals

    Autism Spectrum Disorders (ASD) are neurodevelopmental disorders characterised by impaired social interaction and communication, restricted interests and repetitive behaviours. The severity of these characteristics is posited to lie on a continuum extending into the typical population, and typical adults' performance on behavioural tasks that are impaired in ASD is correlated with the extent to which they display autistic traits (as measured by the Autism Spectrum Quotient, AQ). Individuals with ASD also show structural and functional differences in brain regions involved in social perception. Here we show that variation in AQ in typically developing individuals is associated with altered brain activity in the neural circuit for social attention perception while viewing others' eye gaze. In an fMRI experiment, participants viewed faces looking at variable or constant directions. In control conditions, only the eye region was presented or the heads were shown with eyes closed but oriented at variable or constant directions. The response to faces with variable vs. constant eye gaze direction was associated with AQ scores in a number of regions (posterior superior temporal sulcus, intraparietal sulcus, temporoparietal junction, amygdala, and MT/V5) of the brain network for social attention perception. No such effect was observed for heads with eyes closed or when only the eyes were presented. The results demonstrate a relationship between neurophysiology and autism spectrum traits in the typical (non-ASD) population and suggest that changes in the functioning of the neural circuit for social attention perception are associated with an extended autism spectrum in the typical population. (C) 2011 Elsevier Inc. All rights reserved.

    Neurons in the human amygdala encode face identity, but not gaze direction

    The amygdala is important for face processing, and direction of eye gaze is one of the most socially salient facial signals. Recording from over 200 neurons in the amygdala of neurosurgical patients, we found robust encoding of the identity of neutral-expression faces, but not of their direction of gaze. Processing of gaze direction may rely on a predominantly cortical network rather than the amygdala.